991.
To strengthen the high-confidence capabilities of safety-critical systems, a multi-level high-confidence software architecture is proposed, based on an analysis of the current state of high-confidence assurance mechanisms. The architecture applies temporal and spatial separation together with virtual machine technology to provide an integrated solution for MLS-based embedded safety-critical systems. Building on this architecture, the paper studies multi-level security and safety policy management, information-flow control mechanisms, and methods for evaluating and certifying trusted software, providing certifiable security services for safety-critical embedded systems.
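The information-flow control the abstract describes can be illustrated with a minimal sketch of Bell-LaPadula-style MLS rules ("no read up, no write down"). This is an assumed, simplified reading of the policy layer; the level names and function names are illustrative, not from the paper.

```python
# Illustrative MLS information-flow check (Bell-LaPadula style).
# Levels and APIs are hypothetical, for exposition only.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    # Simple-security property: a subject may read only at or below its level.
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    # *-property: a subject may write only at or above its level,
    # so information cannot leak from a high partition to a low one.
    return LEVELS[subject_level] <= LEVELS[object_level]

print(can_read("secret", "confidential"))   # reading down is allowed
print(can_write("secret", "confidential"))  # writing down is blocked
```

In a separation-kernel setting, checks of this kind would sit in the policy-management layer and gate every cross-partition channel.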
992.
The Bayesian analysis based on the partial likelihood for Cox's proportional hazards model is frequently used because of its simplicity. The Bayesian partial likelihood approach is often justified by showing that it approximates the full Bayesian posterior of the regression coefficients with a diffuse prior on the baseline hazard function. This, however, may not be appropriate when ties exist among uncensored observations. In that case, the full Bayesian and Bayesian partial likelihood posteriors can be much different. In this paper, we propose a new Bayesian partial likelihood approach for many tied observations and justify its use.
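To make the tie problem concrete, here is a sketch of the Breslow-style partial log-likelihood for Cox's model. When several uncensored observations share an event time, this expression is only an approximation, which is the regime the abstract addresses. The data and function are illustrative, not the paper's method.

```python
import numpy as np

def cox_partial_loglik(beta, times, events, x):
    """Breslow-style partial log-likelihood for Cox's model.
    With ties among uncensored times this is only an approximation."""
    eta = x @ beta
    ll = 0.0
    for t in np.unique(times[events == 1]):
        d = (times == t) & (events == 1)   # deaths at time t (ties allowed)
        at_risk = times >= t               # risk set: still under observation
        ll += eta[d].sum() - d.sum() * np.log(np.exp(eta[at_risk]).sum())
    return ll

# Toy data with a tie at t = 2 among the uncensored observations.
times = np.array([1.0, 2.0, 2.0, 3.0])
events = np.array([1, 1, 1, 0])
x = np.array([[0.5], [1.0], [-0.3], [0.2]])
print(cox_partial_loglik(np.zeros(1), times, events, x))
```

At `beta = 0` the value reduces to minus the sum, over event times, of the tie count times the log risk-set size, which makes the formula easy to check by hand.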
993.
Antonio Di Crescenzo, Maria Longobardi. Journal of Statistical Planning and Inference, 2009, 139(12): 4072–4087
In analogy with the cumulative residual entropy recently proposed by Wang et al. [2003a. A new and robust information theoretic measure and its application to image alignment. In: Information Processing in Medical Imaging. Lecture Notes in Computer Science, vol. 2732, Springer, Heidelberg, pp. 388–400; 2003b. Cumulative residual entropy, a new measure of information and its application to image alignment. In: Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV'03), vol. 1, IEEE Computer Society Press, Silver Spring, MD, pp. 548–553], we introduce and study the cumulative entropy, which is a new measure of information alternative to the classical differential entropy. We show that the cumulative entropy of a random lifetime X can be expressed as the expectation of its mean inactivity time evaluated at X. Hence, our measure is particularly suitable to describe the information in problems related to ageing properties of reliability theory based on the past and on the inactivity times. Our results include various bounds to the cumulative entropy, its connection to the proportional reversed hazards model, and the study of its dynamic version that is shown to be increasing if the mean inactivity time is increasing. The empirical cumulative entropy is finally proposed to estimate the new information measure.
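A quick numerical illustration of the measure described above, under the assumed definition CE(X) = -∫ F(x) log F(x) dx (mirroring the cumulative residual entropy with the distribution function F in place of the survival function). For X ~ Uniform(0, 1) the closed form is CE = -∫₀¹ x ln x dx = 1/4, which gives a check on the quadrature.

```python
import numpy as np

def cumulative_entropy(cdf, lo, hi, n=200_001):
    """CE(X) = -integral of F(x) log F(x) dx via the trapezoidal rule."""
    x = np.linspace(lo, hi, n)
    f = cdf(x)
    integrand = np.zeros_like(f)
    mask = f > 0
    integrand[mask] = -f[mask] * np.log(f[mask])   # F log F -> 0 as F -> 0+
    # Trapezoidal rule, written out to stay dependency-light.
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x)))

# Uniform(0, 1): F(x) = x on [0, 1], so CE should be 1/4.
ce = cumulative_entropy(lambda x: x, 0.0, 1.0)
print(round(ce, 4))
```

The same routine can be pointed at an empirical CDF to mimic the empirical cumulative entropy estimator the abstract mentions.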
994.
The variational approach to Bayesian inference enables simultaneous estimation of model parameters and model complexity. An interesting feature of this approach is that it also leads to an automatic choice of model complexity. Empirical results from the analysis of hidden Markov models with Gaussian observation densities illustrate this. If the variational algorithm is initialized with a large number of hidden states, redundant states are eliminated as the method converges to a solution, thereby leading to a selection of the number of hidden states. In addition, through the use of a variational approximation, the deviance information criterion for Bayesian model selection can be extended to the hidden Markov model framework. Calculation of the deviance information criterion provides a further tool for model selection, which can be used in conjunction with the variational approach.
995.
996.
Francesco Audrino, Peter Bühlmann. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2009, 71(3): 655–670
Summary. We propose a flexible generalized auto-regressive conditional heteroscedasticity type of model for the prediction of volatility in financial time series. The approach relies on the idea of using multivariate B-splines of lagged observations and volatilities. Estimation of such a B-spline basis expansion is constructed within the likelihood framework for non-Gaussian observations. As the dimension of the B-spline basis is large, i.e. many parameters, we use regularized and sparse model fitting with a boosting algorithm. Our method is computationally attractive and feasible for large dimensions. We demonstrate its strong predictive potential for financial volatility on simulated and real data, and also in comparison with other approaches, and we present some supporting asymptotic arguments.
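As a reference point for the spline-based generalization above, the classical parametric GARCH(1,1) recursion it extends can be sketched directly. Parameter values here are illustrative, not estimated.

```python
import numpy as np

def garch11_vol(returns, omega=0.05, alpha=0.10, beta=0.85):
    """Classical GARCH(1,1) volatility recursion:
        sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}
    Parameters here are illustrative defaults, not fitted values."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns.var()  # crude initialization from the sample
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return np.sqrt(sigma2)

rng = np.random.default_rng(0)
vol = garch11_vol(rng.normal(size=500))
print(vol[-1] > 0)
```

The paper's model replaces the fixed quadratic/linear terms of this recursion with a B-spline expansion in lagged returns and volatilities, fitted by boosting.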
997.
Several models for studies related to the tensile strength of materials have been proposed in the literature in which the size or length component is taken to be an important factor in studying specimens' failure behaviour. An important model, developed on the basis of the cumulative damage approach, is the three-parameter extension of the Birnbaum–Saunders fatigue model that incorporates the size of the specimen as an additional variable. This model is a strong competitor of the commonly used Weibull model and stands better than the traditional models, which do not incorporate the size effect. The paper considers two such cumulative damage models, checks their compatibility with a real dataset, compares them with some of the recent toolkits, and finally recommends the model that appears most appropriate. Throughout, the study is Bayesian, based on Markov chain Monte Carlo simulation.
998.
Donor imputation is frequently used in surveys. However, very few variance estimation methods that take into account donor imputation have been developed in the literature. This is particularly true for surveys with high sampling fractions using nearest donor imputation, often called nearest‐neighbour imputation. In this paper, the authors develop a variance estimator for donor imputation based on the assumption that the imputed estimator of a domain total is approximately unbiased under an imputation model; that is, a model for the variable requiring imputation. Their variance estimator is valid, irrespective of the magnitude of the sampling fractions and the complexity of the donor imputation method, provided that the imputation model mean and variance are accurately estimated. They evaluate its performance in a simulation study and show that nonparametric estimation of the model mean and variance via smoothing splines brings robustness with respect to imputation model misspecifications. They also apply their variance estimator to real survey data when nearest‐neighbour imputation has been used to fill in the missing values. The Canadian Journal of Statistics 37: 400–416; 2009 © 2009 Statistical Society of Canada
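For readers unfamiliar with the imputation step itself, nearest-neighbour (donor) imputation can be sketched in a few lines: each nonrespondent's value is filled in with the value of the respondent whose auxiliary variable is closest. This is a deliberately minimal sketch with a single auxiliary variable and no tie-breaking or donor-use limits, which production systems add; the variance estimation developed in the paper is a separate, harder problem.

```python
import numpy as np

def nn_impute(y, x):
    """Fill NaNs in y with the y-value of the respondent whose
    auxiliary x is closest (nearest-neighbour donor imputation)."""
    y = np.asarray(y, dtype=float).copy()
    x = np.asarray(x, dtype=float)
    donors = np.flatnonzero(~np.isnan(y))          # respondents
    for i in np.flatnonzero(np.isnan(y)):          # nonrespondents
        donor = donors[np.argmin(np.abs(x[donors] - x[i]))]
        y[i] = y[donor]
    return y

# The unit with x = 4.1 borrows its value from the donor at x = 4.0.
print(nn_impute([10.0, np.nan, 30.0], [1.0, 4.1, 4.0]))
```

Because every imputed value is an observed value from a real donor, imputed data stay in the support of the variable, which is one reason the method is popular in surveys.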
999.
Using the Beveridge-Nelson decomposition within a multivariate dynamic model system and Bayesian Gibbs-sampling estimation, this paper estimates China's output gap over the period 1985Q1 to 2008Q2, and compares the result with estimates from traditional univariate methods in terms of statistical properties and predictive power for monetary-policy adjustment. The empirical results show that the statistical properties of the different output-gap measures differ, and that only the output gap estimated from the multivariate system has significant predictive power for monetary policy. This indicates that the multivariate estimate more fully captures the interaction between output and other related variables, carries richer information, and is therefore of greater reference value for macroeconomic policy adjustment.
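The decomposition's basic mechanics can be illustrated in the simplest univariate case, a deliberately simplified stand-in for the multivariate system the paper estimates. Under an AR(1) model for demeaned output growth, g_t = Δy_t − μ with g_t = φ·g_{t−1} + ε_t, the Beveridge-Nelson cycle has the closed form c_t = −φ/(1−φ)·g_t. The function and data below are illustrative only.

```python
import numpy as np

def bn_cycle_ar1(y):
    """Beveridge-Nelson cycle under a univariate AR(1) model for
    demeaned growth: c_t = -phi/(1-phi) * (dy_t - mean(dy)).
    A simplified stand-in for the paper's multivariate system."""
    g = np.diff(np.asarray(y, dtype=float))
    g_dev = g - g.mean()
    # OLS slope of g_dev on its own lag estimates phi.
    phi = (g_dev[1:] @ g_dev[:-1]) / (g_dev[:-1] @ g_dev[:-1])
    return -phi / (1.0 - phi) * g_dev

# Toy log-output series; the cycle is mean-zero by construction.
cycle = bn_cycle_ar1([0.0, 1.0, 3.0, 4.0, 7.0, 8.0, 10.0])
print(cycle.round(3))
```

The multivariate version replaces the AR(1) with a vector system, so the trend/cycle split also draws on the information in the other variables, which is the gain the abstract emphasizes.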
1000.